Statistical Convergence of Kernel CCA

Authors

  • Kenji Fukumizu
  • Francis R. Bach
  • Arthur Gretton
Abstract

While kernel canonical correlation analysis (kernel CCA) has been applied in many problems, the asymptotic convergence of the functions estimated from a finite sample to the true functions has not yet been established. This paper gives a rigorous proof of the statistical convergence of kernel CCA and a related method (NOCCO), which provides a theoretical justification for these methods. The result also gives a sufficient condition on the decay of the regularization coefficient in the methods to ensure convergence.


Related articles

Statistical Consistency of Kernel Canonical Correlation Analysis

While kernel canonical correlation analysis (CCA) has been applied in many contexts, the convergence of finite sample estimates of the associated functions to their population counterparts has not yet been established. This paper gives a mathematical proof of the statistical convergence of kernel CCA, providing a theoretical justification for the method. The proof uses covariance operators defi...


An Operator Viewpoint to Analysis of Conditional Kernel Canonical Correlation

Kernel canonical correlation analysis (CCA) is a nonlinear extension of CCA which aims at extracting information shared by two random variables. In this paper, a new notion of conditional kernel CCA is introduced. Conditional kernel CCA aims at analyzing the effect of a variable Z on the dependence between X and Y. Rates of convergence of an empirical normalized conditional cross-covariance ope...


Almost Sure Convergence of Kernel Bivariate Distribution Function Estimator under Negative Association

Let $\{X_n, n \ge 1\}$ be a strictly stationary sequence of negatively associated random variables with common distribution function F. In this paper, we consider the estimation of the two-dimensional distribution function of $(X_1, X_{k+1})$ for fixed $k \in \mathbb{N}$ based on kernel-type estimators. We establish asymptotic normality and moment properties, from which we derive the optimal bandwidth...
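A kernel-type estimator of the bivariate distribution function of the lagged pairs $(X_i, X_{i+k})$ can be sketched as follows: the indicator $1\{X_i \le t\}$ is smoothed by an integrated kernel, here the Gaussian CDF. This is an illustrative sketch under assumed choices (Gaussian kernel, bandwidth h), not the estimator of the cited paper; the function name is mine.

```python
import math
import numpy as np

# integrated Gaussian kernel: the standard normal CDF, vectorized
_phi = np.vectorize(lambda u: 0.5 * (1.0 + math.erf(u / math.sqrt(2.0))))

def kernel_bivariate_cdf(x, t1, t2, k=1, h=0.25):
    """Kernel-smoothed estimate of F_k(t1, t2) = P(X_1 <= t1, X_{1+k} <= t2)
    from one realization x of a stationary sequence, averaging the smoothed
    indicators over all lagged pairs (X_i, X_{i+k})."""
    a, b = x[:-k], x[k:]  # the n - k lagged pairs
    return float(np.mean(_phi((t1 - a) / h) * _phi((t2 - b) / h)))
```

For an i.i.d. standard normal sequence the true value at the origin is $F_1(0, 0) = 0.25$, which the estimator recovers up to smoothing bias and sampling noise.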


Ridge-Penalty Regularization for Kernel-CCA

CCA and Kernel-CCA are powerful statistical tools that have been successfully employed for feature extraction. However, when working in high-dimensional signal spaces, care has to be taken to avoid overfitting. This paper discusses the influence of ridge-penalty regularization on kernel-CCA by relating it to multivariate linear regression (MLR) and partial least squares (PLS). Experimental result...


Using articulatory measurements to learn better acoustic features

We summarize recent work on learning improved acoustic features, using articulatory measurements that are available for training but not at test time. The goal is to improve recognition using articulatory information, but without explicitly solving the difficult acoustics-to-articulation inversion problem. We formulate the problem as learning a (linear or nonlinear) transformation of standard a...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2005